50 research outputs found

    Reducing vortex density in superconductors using the ratchet effect

    Full text link
    A serious obstacle that impedes the application of low- and high-temperature superconductor (SC) devices is the presence of trapped flux. Flux lines, or vortices, are induced by fields as small as the Earth's magnetic field. Once present, vortices dissipate energy and generate internal noise, limiting the operation of numerous superconducting devices. Methods used to overcome this difficulty include the pinning of vortices by the incorporation of impurities and defects, the construction of flux dams, slots and holes, and magnetic shields that block the penetration of new flux lines into the bulk of the SC or reduce the magnetic field in the immediate vicinity of the superconducting device. Naturally, the most desirable approach would be to remove the vortices from the bulk of the SC. There is no known phenomenon, however, that could form the basis for such a process. Here we show that the application of an ac current to a SC that is patterned with an asymmetric pinning potential can induce vortex motion whose direction is determined only by the asymmetry of the pattern. The mechanism responsible for this phenomenon is the so-called ratchet effect, and its working principle applies to both low- and high-temperature SCs. As a first step, we demonstrate here that with an appropriate choice of the pinning potential the ratchet effect can be used to remove vortices from low-temperature SCs in the parameter range required for various applications. Comment: 7 pages, 4 figures, Nature (in press)
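
    The mechanism invoked here is the generic "rocked ratchet": an unbiased ac drive applied to particles in an asymmetric periodic potential yields a net drift whose direction is set by the asymmetry alone. The sketch below illustrates this with an overdamped Langevin simulation; the sawtooth-like potential, drive amplitude, noise strength, and time step are illustrative choices, not the parameters of the patterned SC devices reported in the paper.

```python
import numpy as np

# Minimal rocked-ratchet sketch (illustrative parameters, not the paper's device):
# an ensemble of overdamped particles in an asymmetric periodic potential,
# driven by an unbiased ac force plus thermal noise.  For suitable parameters a
# net drift appears whose direction is fixed by the potential asymmetry alone.
rng = np.random.default_rng(0)

def dV(x):
    # derivative of V(x) = sin(2*pi*x) + 0.25*sin(4*pi*x), a standard asymmetric ratchet potential
    return 2*np.pi*np.cos(2*np.pi*x) + np.pi*np.cos(4*np.pi*x)

dt, steps = 1e-3, 200_000
A, omega, kT = 4.0, 1.0, 0.3           # ac amplitude, drive frequency, noise strength
x = np.zeros(500)                      # positions of 500 "vortices"

t = 0.0
for _ in range(steps):
    force = -dV(x) + A*np.cos(omega*t) # deterministic part: potential gradient + ac drive
    x += force*dt + np.sqrt(2*kT*dt)*rng.standard_normal(x.size)
    t += dt

print("mean drift velocity:", x.mean()/t)  # nonzero despite the zero-mean drive
```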

    Evolution of associative learning in chemical networks

    Get PDF
    Organisms that can learn about their environment and modify their behaviour appropriately during their lifetime are more likely to survive and reproduce than organisms that do not. While associative learning – the ability to detect correlated features of the environment – has been studied extensively in nervous systems, where the underlying mechanisms are reasonably well understood, mechanisms within single cells that could allow associative learning have received little attention. Here, using in silico evolution of chemical networks, we show that there exists a diversity of remarkably simple and plausible chemical solutions to the associative learning problem, the simplest of which uses only one core chemical reaction. We then asked to what extent a linear combination of chemical concentrations in the network could approximate the ideal Bayesian posterior of the environment given the stimulus history so far. This Bayesian analysis revealed the 'memory traces' of the chemical network. The implication of this paper is that there is little reason to believe that a lack of suitable phenotypic variation would prevent associative learning from evolving in cell signalling, metabolic, gene regulatory, or a mixture of these networks in cells.
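
    As a concrete (and deliberately oversimplified) picture of single-cell associative learning, the toy mass-action scheme below lets a "memory" species M form only while two stimuli S1 and S2 overlap, after which M gates a response R to S1 alone. The species names, rate constants, and stimulus protocol are all assumptions for illustration; this is not one of the evolved networks or the one-core-reaction solution reported in the paper.

```python
import numpy as np

# Toy chemical "Pavlovian" association, integrated with forward Euler.
# M (memory) is produced only when S1 and S2 co-occur; M then gates a
# response R to S1 alone.  All rates are illustrative.
k_mem, k_resp, d_mem, d_resp = 1.0, 1.0, 0.05, 0.5
dt, T = 0.01, 100.0
times = np.arange(0.0, T, dt)

def stimuli(t):
    s1 = 1.0 if (10 < t < 20) or (60 < t < 70) else 0.0   # S1: paired, later alone
    s2 = 1.0 if (10 < t < 20) else 0.0                    # S2: only during the pairing
    return s1, s2

M = R = 0.0
response = []
for t in times:
    s1, s2 = stimuli(t)
    M += (k_mem*s1*s2 - d_mem*M)*dt     # association forms only while S1 and S2 overlap
    R += (k_resp*s1*M - d_resp*R)*dt    # response to S1 is gated by the memory M
    response.append(R)

print("peak response during pairing :", round(max(response[:int(30/dt)]), 3))
print("peak response to S1 alone    :", round(max(response[int(50/dt):]), 3))  # nonzero -> learned
```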

    Programmability of Chemical Reaction Networks

    Get PDF
    Motivated by the intriguing complexity of biochemical circuitry within individual cells, we study Stochastic Chemical Reaction Networks (SCRNs), a formal model that considers a set of chemical reactions acting on a finite number of molecules in a well-stirred solution according to standard chemical kinetics equations. SCRNs have been widely used for describing naturally occurring (bio)chemical systems, and with the advent of synthetic biology they have become a promising language for the design of artificial biochemical circuits. Our interest here is the computational power of SCRNs and how they relate to more conventional models of computation. We survey known connections and give new connections between SCRNs and Boolean Logic Circuits, Vector Addition Systems, Petri Nets, Gate Implementability, Primitive Recursive Functions, Register Machines, Fractran, and Turing Machines. A theme of these investigations is the thin line between decidable and undecidable questions about SCRN behavior.
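
    To make the SCRN-as-computer idea concrete, here is a minimal stochastic simulation in the spirit of the paper's computational reading of chemical kinetics: the single reaction X + Y -> Z, run to completion, computes min(x, y) in the final count of Z. The direct-method sampler, species names, and rate constant are illustrative; this is not code from the paper.

```python
import numpy as np

# Gillespie-style (direct method) run of a one-reaction SCRN that computes
# f(x, y) = min(x, y): the reaction X + Y -> Z fires until one input species
# is exhausted.  Rate constant and names are made up for the sketch.
rng = np.random.default_rng(1)

def ssa_min(x, y, k=1.0):
    z, t = 0, 0.0
    while x > 0 and y > 0:
        propensity = k * x * y                   # mass-action propensity of X + Y -> Z
        t += rng.exponential(1.0 / propensity)   # waiting time to the next firing
        x, y, z = x - 1, y - 1, z + 1            # apply the reaction's stoichiometry
    return z, t

count, elapsed = ssa_min(17, 42)
print("min(17, 42) computed as the final Z count:", count,
      "after simulated time", round(elapsed, 4))
```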

    Avalanches in self-organized critical neural networks: A minimal model for the neural SOC universality class

    Full text link
    The brain keeps its overall dynamics in a corridor of intermediate activity, and it has been a long-standing question what mechanism could achieve this task. Ideas from statistical physics have long suggested that this homeostasis of brain activity could occur even without a central regulator, via self-organization at the level of neurons and their interactions alone. Such physical mechanisms, from the class of self-organized criticality, exhibit characteristic dynamical signatures similar to the seismic activity related to earthquakes. Measurements of cortical resting activity showed the first signs of dynamical signatures potentially pointing to self-organized critical dynamics in the brain. Indeed, recent more accurate measurements allowed for a detailed comparison with the scaling theory of non-equilibrium critical phenomena, proving the existence of criticality in cortex dynamics. We here compare this new evaluation of cortex activity data to the predictions of the earliest physics spin model of self-organized critical neural networks. We find that the model matches the recent experimental data and its interpretation in terms of dynamical signatures of criticality in the brain. The combination of signatures of criticality, power-law distributions of avalanche sizes and durations, as well as a specific scaling relationship between anomalous exponents, defines a universality class characteristic of the particular critical phenomenon observed in the neural experiments. The spin model is a candidate for a minimal model of a self-organized critical adaptive network for the universality class of neural criticality. As a prototype model, it provides the background for models that include more biological details, yet share the same universality class characteristic of the homeostasis of activity in the brain. Comment: 17 pages, 5 figures
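
    The avalanche statistics referred to above can be illustrated with a much simpler object than the paper's adaptive spin model: a critical branching process, whose mean-field avalanche-size distribution falls off as a power law with exponent 3/2 at branching ratio sigma = 1. The sketch below is only this generic illustration; it does not reproduce the spin model, the duration statistics, or the anomalous-exponent scaling relation analyzed in the paper.

```python
import numpy as np

# Critical branching process: each active unit activates a Poisson(sigma)
# number of units in the next step.  At sigma = 1 avalanche sizes develop a
# power-law tail (mean-field density ~ s**-3/2, so P(size >= s) ~ s**-1/2).
rng = np.random.default_rng(2)

def avalanche_size(sigma=1.0, cap=10**6):
    active, size = 1, 1
    while active and size < cap:
        active = rng.poisson(sigma * active)   # total offspring of all active units
        size += active
    return size

sizes = np.array([avalanche_size() for _ in range(20_000)])
for s in (1, 10, 100, 1000):
    print(f"P(size >= {s:4d}) = {np.mean(sizes >= s):.4f}")   # ~ s**-0.5 decay at criticality
```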

    The Goldbeter-Koshland switch in the first-order region and its response to dynamic disorder

    Get PDF
    In their classical work (Proc. Natl. Acad. Sci. USA, 1981, 78:6840-6844), Goldbeter and Koshland mathematically analyzed a reversible covalent modification system which is highly sensitive to the concentration of effectors. Its signal-response curve appears sigmoidal, constituting a biochemical switch. However, the switch behavior only emerges in the "zero-order region", i.e. when the signal molecule concentration is much lower than that of the substrate it modifies. In this work we showed that the switching behavior can also occur under comparable concentrations of signals and substrates, provided that the signal molecules catalyze the modification reaction cooperatively. We also studied the effect of dynamic disorder on the proposed biochemical switch, in which the enzymatic reaction rates, instead of being constant, appear as stochastic functions of time. We showed that the system is robust to dynamic disorder at bulk concentrations, but if the dynamic disorder is quasi-static, large fluctuations of the switch response may be observed at low concentrations. Such fluctuations are relevant to many biological functions. They can be reduced either by increasing the conformational interconversion rate of the protein or by correlating the enzymatic reaction rates in the network. Comment: 23 pages, 4 figures, accepted by PLOS ONE
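
    For reference, the zero-order result that this work extends is captured by the classical Goldbeter-Koshland steady-state function; the short sketch below evaluates it to show how the response sharpens from hyperbolic to switch-like as the scaled Michaelis constants J1, J2 shrink. The parameter values are illustrative, and the cooperative first-order mechanism and the dynamic-disorder analysis proposed in this paper are not reproduced here.

```python
import numpy as np

# Classical Goldbeter-Koshland function: steady-state fraction of modified
# substrate as a function of the two converter-enzyme activities v1, v2 and the
# scaled Michaelis constants J1, J2.  Small J1, J2 give the zero-order switch.
def goldbeter_koshland(v1, v2, J1, J2):
    B = v2 - v1 + J1*v2 + J2*v1
    return 2.0*v1*J2 / (B + np.sqrt(B*B - 4.0*(v2 - v1)*v1*J2))

v1 = np.linspace(0.01, 2.0, 9)          # signal (modifying-enzyme activity); v2 fixed at 1
for J in (1.0, 0.01):                   # J ~ 1: first-order regime; J << 1: zero-order regime
    resp = goldbeter_koshland(v1, 1.0, J, J)
    print(f"J = {J:5.2f}:", np.round(resp, 3))   # switch-like jump near v1 = v2 only for small J
```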

    Emergent complex neural dynamics

    Full text link
    A large repertoire of spatiotemporal activity patterns in the brain is the basis for adaptive behaviour. Understanding the mechanism by which the brain's hundred billion neurons and hundred trillion synapses manage to produce such a range of cortical configurations in a flexible manner remains a fundamental problem in neuroscience. One plausible solution is the involvement of universal mechanisms of emergent complex phenomena evident in dynamical systems poised near a critical point of a second-order phase transition. We review recent theoretical and empirical results supporting the notion that the brain is naturally poised near criticality, as well as its implications for a better understanding of the brain.

    Quantum dynamics in strong fluctuating fields

    Full text link
    A large number of multifaceted quantum transport processes in molecular systems and physical nanosystems can be treated in terms of quantum relaxation processes which couple to one or several fluctuating environments. A thermal equilibrium environment can conveniently be modelled by a thermal bath of harmonic oscillators. An archetypal situation is provided by a two-state dissipative quantum dynamics, commonly known under the label of spin-boson dynamics. An interesting and nontrivial physical situation emerges, however, when the quantum dynamics evolves far away from thermal equilibrium. This occurs, for example, when a charge-transferring medium possesses nonequilibrium degrees of freedom, or when a strong time-dependent control field is applied externally. Accordingly, certain parameters of the underlying quantum subsystem acquire a stochastic character. Herein, we review the general theoretical framework, based on the method of projection operators, which yields quantum master equations for systems that are exposed to strong external fields. This allows one to investigate on a common basis the influence of nonequilibrium fluctuations and periodic electrical fields on quantum transport processes. Most importantly, such strong fluctuating fields induce a whole variety of nonlinear and nonequilibrium phenomena. A characteristic feature of such dynamics is the absence of thermal (quantum) detailed balance. Comment: review article, Advances in Physics (2005), in press
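
    A minimal example of the kind of averaging such a framework formalizes is pure dephasing of a two-level system by nonequilibrium dichotomous noise (a Kubo-Anderson picture): the level splitting jumps randomly between two values, and the ensemble-averaged coherence decays. The stochastic-trajectory sketch below is only this toy case with made-up parameters; it is not the projection-operator master-equation treatment reviewed in the article.

```python
import numpy as np

# Qubit dephasing by random telegraph noise: the level splitting switches
# between +nu and -nu at rate gamma; the averaged coherence |<exp(i*phase)>|
# decays.  Slow noise (gamma << nu) gives oscillatory decay, fast noise gives
# motional narrowing.  Parameters are illustrative.
rng = np.random.default_rng(3)

nu, gamma, dt, steps, ntraj = 1.0, 0.2, 0.01, 4000, 2000
phase = np.zeros(ntraj)
sign = rng.choice([-1.0, 1.0], size=ntraj)     # current value of the telegraph noise
coherence = []

for _ in range(steps):
    phase += sign * nu * dt                    # each trajectory accumulates a random phase
    flips = rng.random(ntraj) < gamma * dt     # Poissonian switching events
    sign[flips] *= -1.0
    coherence.append(abs(np.mean(np.exp(1j * phase))))

print("ensemble-averaged coherence at t =", steps*dt, ":", round(coherence[-1], 4))
```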

    Brain Performance versus Phase Transitions

    Get PDF
    We here illustrate how a well-founded study of the brain may originate in assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms; the result is a description of how the excitability associated with (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous, with rapidly time-varying intensities mimicking fatigue and potentiation. Emergence then becomes quite robust against modification of the wiring topology (in fact, we considered networks ranging from fully connected to the Homo sapiens connectome), showing the essential role of synaptic flickering in computation. We also suggest how to experimentally disclose significant changes during actual brain operation. The authors acknowledge support from the Spanish Ministry of Economy and Competitiveness under the project FIS2013-43201-P.
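
    The role of noise and excitability in letting a weak signal through can be caricatured with a single noisy leaky integrate-and-fire unit driven by a subthreshold periodic input: with too little noise the unit stays silent, at intermediate noise its spikes cluster on the signal crests, and at high noise the signal is washed out. The sketch below is only this single-unit caricature with assumed parameters; it does not include the heterogeneous, time-varying synapses or the network topologies studied in the paper.

```python
import numpy as np

# Noisy leaky integrate-and-fire unit with a subthreshold periodic signal.
# Comparing spikes emitted during the signal crest versus the trough shows the
# signal being transmitted best at intermediate noise.  Parameters are assumed.
rng = np.random.default_rng(4)

def crest_vs_trough_spikes(noise_sigma, T=20000.0, dt=0.1):
    v, thresh, tau = 0.0, 1.0, 10.0
    base, A, period = 0.8, 0.05, 100.0           # mean drive + weak signal, both subthreshold
    crest = trough = 0
    for i in range(int(T / dt)):
        s = np.sin(2*np.pi*(i*dt)/period)
        v += dt*((base + A*s - v)/tau) + noise_sigma*np.sqrt(dt)*rng.standard_normal()
        if v >= thresh:                          # fire and reset
            v = 0.0
            if s > 0:
                crest += 1                       # spike during the signal crest
            else:
                trough += 1                      # spike during the signal trough
    return crest, trough

for sigma in (0.02, 0.1, 0.5):
    c, tr = crest_vs_trough_spikes(sigma)
    print(f"noise sigma = {sigma:4.2f}   crest spikes = {c:5d}   trough spikes = {tr:5d}")
```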

    A nucleotide binding rectification Brownian ratchet model for translocation of Y-family DNA polymerases

    Get PDF
    Y-family DNA polymerases are characterized by low-fidelity synthesis on undamaged DNA and by the ability to catalyze translesion synthesis over damaged DNA. Their translocation along the DNA template is an important event during processive DNA synthesis. In this work we present a Brownian ratchet model for this translocation, in which the directed translocation is rectified by nucleotide binding to the polymerase. Using the model, different features of the available structures for Dpo4, Dbh and polymerase ι in binary and ternary forms can be easily explained. Other dynamic properties of the Y-family polymerases, such as the fast translocation event upon dNTP binding for Dpo4 and the considerable variation in processivity among the polymerases, can also be well explained with the model. In addition, we present predicted results for the DNA synthesis rate versus the external force acting on the Dpo4 and Dbh polymerases. Moreover, we compare the effect of the external force on the DNA synthesis rate of the Y-family polymerases with that on the replicative DNA polymerase.
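
    One way to see how "rectification by nucleotide binding" sets a speed, and how an opposing load slows it, is a two-state toy kinetic scheme: the polymerase fluctuates between pre- and post-translocation positions, dNTP can bind only in the post-translocation state (locking the forward step), and catalysis then completes the cycle. The rates, load-distribution factor, and dNTP binding flux below are assumed for illustration and are not fitted to the Dpo4 or Dbh data from the paper.

```python
import numpy as np

# Two-state binding-rectified ratchet: Pre <-> Post translocation hopping, with
# dNTP binding allowed only in Post.  An opposing force F (pN) tilts the hopping
# rates by Boltzmann factors.  All numbers are illustrative.
kT = 4.1                       # pN*nm at room temperature
d = 0.34                       # nm, one-base step
delta = 0.5                    # assumed load-distribution factor
k_f0, k_b0 = 500.0, 500.0      # bare forward/backward hopping rates (1/s)
k_on_c, k_cat = 50.0, 10.0     # dNTP binding flux (k_on*[dNTP]) and catalysis (1/s)

def velocity(F):
    k_f = k_f0 * np.exp(-F * d * delta / kT)         # load slows the forward hop
    k_b = k_b0 * np.exp(+F * d * (1 - delta) / kT)   # and speeds the backward hop
    # mean first-passage time from Pre to the dNTP-bound state in the two-state chain
    t_bind = 1.0/k_f + (k_f + k_b) / (k_f * k_on_c)
    return d / (t_bind + 1.0/k_cat)                  # nm per full incorporation cycle per second

for F in (0.0, 2.0, 5.0, 10.0):
    print(f"opposing force F = {F:4.1f} pN  ->  synthesis speed ~ {velocity(F):6.3f} nm/s")
```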

    Engines and demons

    No full text